
[7.15] [Reporting] Fix slow CSV with large max size bytes (#120365) #120582

Merged
merged 1 commit into elastic:7.15 from backport/7.15/pr-120365 on Dec 7, 2021

Conversation

jloleysens
Contributor

Backports the following commits to 7.15 (see the sketch after the commit list):

* use Buffer.alloc + .set API instead of .concat

* refactor variable names and actually assign to this.buffer

* ok, looks like an array of buffers could work

* added a large comment and refactored some variable names

* fix comment

* refactored logic to deal with an edge case where partial buffers should be added; also throw if a bad config is detected

* added new test for detecting when the write stream throws for bad config

* updated logic to not ever call .slice(0), updated the guard for the config error, updated a comment

* refactor totalBytesConsumed -> bytesToFlush

* use the while loop Mike wrote

* remove unused variable

* update comment

Co-authored-by: Kibana Machine <[email protected]>
# Conflicts:
#	x-pack/plugins/reporting/server/lib/content_stream.ts
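For context, here is a minimal sketch of the buffering strategy these commits describe, assuming a hypothetical `ChunkBuffer` class; the names (`ChunkBuffer`, `maxChunkSizeBytes`, `write`, `flush`) are illustrative and are not lifted from the actual `x-pack/plugins/reporting/server/lib/content_stream.ts` implementation. Incoming chunks are queued in an array rather than repeatedly passed through `Buffer.concat`, and a flush drains whole buffers with `Buffer.alloc` + `.set`, splitting only a trailing partial buffer:

```ts
// Hypothetical sketch of the approach, not the actual content_stream.ts code.
class ChunkBuffer {
  // Queue incoming chunks instead of growing one Buffer with .concat,
  // which re-copies every previously written byte on each write.
  private buffers: Buffer[] = [];
  private bytesBuffered = 0;

  constructor(private readonly maxChunkSizeBytes: number) {
    // Guard against bad configuration up front, as one commit describes.
    if (!Number.isInteger(maxChunkSizeBytes) || maxChunkSizeBytes <= 0) {
      throw new Error(`maxChunkSizeBytes must be a positive integer, got ${maxChunkSizeBytes}`);
    }
  }

  write(chunk: Buffer): void {
    this.buffers.push(chunk);
    this.bytesBuffered += chunk.byteLength;
  }

  // Drain buffered bytes in pieces of exactly maxChunkSizeBytes. Whole
  // buffers are dequeued as-is; only a trailing partial buffer is split
  // (so .slice(0) is never called), and its tail stays queued.
  *flush(): Generator<Buffer> {
    while (this.bytesBuffered >= this.maxChunkSizeBytes) {
      const out = Buffer.alloc(this.maxChunkSizeBytes); // Buffer.alloc + .set instead of .concat
      let bytesToFlush = 0;

      while (bytesToFlush < this.maxChunkSizeBytes) {
        const next = this.buffers[0];
        const remaining = this.maxChunkSizeBytes - bytesToFlush;

        if (next.byteLength <= remaining) {
          // The whole buffer fits into this flush; dequeue it untouched.
          out.set(this.buffers.shift()!, bytesToFlush);
          bytesToFlush += next.byteLength;
        } else {
          // Edge case: add a partial buffer and keep its tail queued.
          out.set(next.subarray(0, remaining), bytesToFlush);
          this.buffers[0] = next.subarray(remaining);
          bytesToFlush += remaining;
        }
      }

      this.bytesBuffered -= bytesToFlush;
      yield out;
    }
  }
}
```

Batching writes this way replaces a copy of the entire accumulated buffer on every chunk with a single copy per flushed chunk, which is presumably the source of the speedup the PR title refers to for large max size bytes values.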
jloleysens enabled auto-merge (squash) on December 7, 2021, 09:24
@kibana-ci
Collaborator

💚 Build Succeeded

Metrics [docs]

✅ unchanged

To update your PR or re-run it, just comment with:
@elasticmachine merge upstream

jloleysens merged commit 976008e into elastic:7.15 on Dec 7, 2021
jloleysens deleted the backport/7.15/pr-120365 branch on December 7, 2021, 14:10